Matrix

We shouldn't think of a matrix as just a flat list of scalars.

$$\begin{bmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \\ m_{31} & m_{32} & m_{33} \end{bmatrix}$$

Interpretation:

  • n column vectors in a real-valued M-dimensional space.
  • m row vectors in a real-valued N-dimensional space.
  • A transformation: each column is a new basis vector.
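The first two interpretations can be seen directly in NumPy by slicing; a minimal sketch with a hypothetical 2×3 matrix:

```python
import numpy as np

# A 2x3 matrix: M = 2 rows, N = 3 columns
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A[:, 0])  # first column vector: lives in 2-dimensional space
print(A[0, :])  # first row vector: lives in 3-dimensional space
```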

Identity/Diagonal Matrix:

$$I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

Symmetric matrix:

A matrix is symmetric if it is equal to its own transpose:

$$A = A^T$$

  1. $(A^T)^T = A$: the transpose of a transpose is the original matrix.
  2. $(A+B)^T = A^T + B^T$: the transpose of a sum of two matrices is equal to the sum of their transposes.
  3. $(AB)^T = B^T A^T$: the transpose of a product of two matrices is equal to the product of their transposes in reverse order.
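The three transpose rules can be checked numerically; a quick sketch with two arbitrary example matrices:

```python
import numpy as np

# Two arbitrary example matrices (any shapes compatible for A @ B work)
A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [5, 2]])

print(np.array_equal(A.T.T, A))              # (A^T)^T = A
print(np.array_equal((A + B).T, A.T + B.T))  # (A+B)^T = A^T + B^T
print(np.array_equal((A @ B).T, B.T @ A.T))  # (AB)^T = B^T A^T
```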

Trace of a matrix

The trace is the sum of the diagonal elements.

$$\operatorname{trace}(A) = \sum_{i} a_{ii}$$
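In NumPy the trace is computed with `np.trace`; a minimal sketch:

```python
import numpy as np

A = np.arange(9).reshape(3, 3)  # diagonal elements are 0, 4, 8
print(np.trace(A))              # 12
print(A.diagonal().sum())       # same result, written out explicitly
```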

Reduction operations (sum across rows)

Rows will disappear.

```python
import numpy as np

A = np.arange(9).reshape(3, 3)
print(A, end='\n\n')

# Reduction on the matrix: sum along rows (axis=0 is collapsed)
A_c = A.sum(axis=0, keepdims=False)
print(A_c)  # A_c.shape is (3,) -- the rows are canceled out
```

Output:

[[0 1 2]
[3 4 5]
[6 7 8]]

[ 9 12 15]

Summing along axis 0 collapses the row axis, so the 3×3 matrix becomes a vector of shape (3,) (with keepdims=True it would stay 1×3).
The axis where you do the operation is said to be flattened (collapsed).

Exam question

What is the geometric meaning of translating a point cloud to the origin, and how do we do it?

We compute the mean of the point cloud; the mean is itself a point living in the same dimensional space as the cloud.

We then subtract the mean from all the points.
The new mean of the point cloud is 0.

For each point (x, y), we subtract the mean: (x, y) - (meanX, meanY).
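The centering step above can be sketched in NumPy with a small made-up 2-D point cloud (broadcasting subtracts the mean from every row):

```python
import numpy as np

# Hypothetical 2-D point cloud: rows are points, columns are x and y
points = np.array([[1.0, 2.0],
                   [3.0, 6.0],
                   [5.0, 4.0]])

mean = points.mean(axis=0)   # the mean is a point in the same 2-D space
centered = points - mean     # broadcasting: subtract the mean from every point

print(centered.mean(axis=0))  # the new mean is the origin, [0. 0.]
```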

Hadamard multiplication

We multiply element-wise. In Python (NumPy), it is done with the `*` operator between matrices.
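A minimal sketch contrasting the element-wise `*` with ordinary matrix multiplication:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[10, 20], [30, 40]])

# Hadamard (element-wise) product: each entry times its counterpart
print(A * B)   # [[ 10  40], [ 90 160]]

# Not to be confused with matrix multiplication, which is A @ B
print(A @ B)
```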

Matrices as linear map between spaces/Linear transformations

They actually map a space into another space. Or an input vector to an output vector.

We are gonna take the basis vectors, and we are gonna change them. Then the operation can be done for every other point/vector.

It's basically just a change of basis vectors.

We are given the vector

$$\begin{bmatrix} 5 \\ 7 \end{bmatrix}$$

and we can also represent it as a linear combination of scaled basis vectors:

$$5\hat{i} + 7\hat{j}$$

Now, the whole point of a linear transformation is to change the basis vectors, so we do:

$$\hat{i} = \begin{bmatrix} 3 \\ 2 \end{bmatrix}, \quad \hat{j} = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$$

The result is:
$$5\begin{bmatrix} 3 \\ 2 \end{bmatrix} + 7\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 15 \\ 10 \end{bmatrix} + \begin{bmatrix} 14 \\ 7 \end{bmatrix}$$

The linear combination of our scaled new basis vectors is then our transformed vector:

$$\begin{bmatrix} 15 \\ 10 \end{bmatrix} + \begin{bmatrix} 14 \\ 7 \end{bmatrix} = \begin{bmatrix} 15+14 \\ 10+7 \end{bmatrix} = \begin{bmatrix} 29 \\ 17 \end{bmatrix}$$

This transformation can be represented as a matrix:
$$A = \begin{bmatrix} 3 & 2 \\ 2 & 1 \end{bmatrix}$$

Each column of the matrix is a new basis vector.
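The worked example above is just a matrix-vector multiplication; a quick NumPy check:

```python
import numpy as np

A = np.array([[3, 2],
              [2, 1]])   # columns are the new basis vectors
v = np.array([5, 7])

# Applying the transformation to v is matrix-vector multiplication
print(A @ v)   # [29 17]
```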

Alternative writing

Often, other people collapse the last part into a single vector whose elements are yet to be calculated.

Visual examples:

Sometimes transformations can induce severe distortion, for example when the two new basis vectors lie in the same direction (they become linearly dependent, and the plane collapses onto a line).

Matrix multiplication

Now that we have established matrices as transformations, we can merge two matrices/transformations into a single matrix, called the composition of the two transformations.

Basically, composition goes from right to left, so in this case we first apply the rotation and then the shear.
When we apply the 90° rotation, the standard basis vectors of the Cartesian plane land at:

$$\hat{i} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad \hat{j} = \begin{bmatrix} -1 \\ 0 \end{bmatrix}$$

We now apply the shear to these rotated basis vectors, to see where they land, and that will be our final composition:

$$\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} -1 \\ 0 \end{bmatrix}$$

Basically, we expand each product as a linear combination of the columns of the shear:

$$\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} 0 \\ 1 \end{bmatrix} = 0\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 1\begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \hat{i}$$

$$\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} -1 \\ 0 \end{bmatrix} = -1\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 0\begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} -1 \\ 0 \end{bmatrix} = \hat{j}$$

The result is the matrix:
$$\begin{bmatrix} 1 & -1 \\ 1 & 0 \end{bmatrix}$$

Which is exactly the composition of the shear applied after the rotation.
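The same composition can be verified numerically; a sketch assuming a 90° counterclockwise rotation and the shear used above:

```python
import numpy as np

rotation = np.array([[0, -1],
                     [1,  0]])   # 90-degree counterclockwise rotation
shear = np.array([[1, 1],
                  [0, 1]])

# Composition reads right to left: first rotation, then shear
composition = shear @ rotation
print(composition)   # [[ 1 -1], [ 1  0]]
```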

Determinant

The determinant of a matrix/transformation is the factor by which any area is scaled by the transformation.

The determinant also allows for negative values.
Negative values happen when the space is "flipped", for example when the unit vectors cross each other.

A determinant of 0 means that dimensions have collapsed: for example, a volume has been squished into a plane, or an area has been squished into a single line.

This is what happens when the vectors/columns of the matrix are not linearly independent.
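The three cases above (scaling, orientation flip, collapse) can be sketched with `np.linalg.det` on some made-up example matrices:

```python
import numpy as np

scale = np.array([[2.0, 0.0],
                  [0.0, 3.0]])    # areas are scaled by 2 * 3 = 6
flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])     # swaps the basis vectors: orientation flips
collapse = np.array([[1.0, 2.0],
                     [2.0, 4.0]])  # columns are linearly dependent

print(np.linalg.det(scale))     # 6.0
print(np.linalg.det(flip))      # -1.0
print(np.linalg.det(collapse))  # 0.0 (up to floating-point error)
```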